
    Modeling trajectories of perceived leg exertion during maximal cycle ergometer exercise in children and adolescents

    BACKGROUND: Borg developed scales for rating pain and perceived exertion in adults that have also been used in pediatric populations. Models describing functional relationships between perceived exertion and work capacity have not been studied in children. We compared different models and their fits to individual trajectories and assessed the variability in these trajectories. METHODS: Ratings of perceived exertion (RPE) were collected from 79 children. Progressive cycle ergometer testing was performed to maximal work capacity, with test durations ranging from 6 to 12 minutes. Ratings were obtained during each 1-minute increment. Work was normalized to individual maximal work capacity (Wmax). A delay was defined as the fraction of Wmax at which an increase in ratings of leg fatigue first occurred. Such a delay term allows the characterization of trajectories for children whose ratings were initially constant with increasing work. Two models were considered: a delay model and a power model that is commonly used to analyze Borg ratings. Individual model fit was assessed with root mean squared error (RMSE). Functional clustering algorithms were used to identify patterns. RESULTS: Leg tiredness developed quickly for some children, while for others there was a delay before ratings of leg exertion increased with increasing work. The models with the smallest RMSE for individual trajectories included either a delay and a quadratic term (quadratic-delay model) or a power function and a delay term (power-delay model), rather than a simple power function. The median delay was 40% Wmax (interquartile range (IQR): 26-49%) in the quadratic-delay model, while the median exponent was 1.03 (IQR: 0.83-1.78) in the power-delay model. Nine clusters were identified, showing linear or quadratic patterns with or without a delay. Cluster membership did not depend on age, gender, or diagnosis. CONCLUSION: Children and adolescents vary widely in their capacity to rate their perceptions and exhibit different functional relationships between ratings of perceived exertion and work capacity normalized across individuals. Models including a delay term, a linear component, or a power function can describe these individual trajectories of perceived leg exertion during incremental exercise to voluntary exhaustion.
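    A rough formalization of the two best-fitting model families, in notation of my own (the abstract does not give the exact parameterization): writing w for work as a fraction of Wmax, d for the delay, and (x)_+ = \max(x, 0),

        \mathrm{RPE}(w) = \alpha + \beta\,(w - d)_+ + \gamma\,(w - d)_+^{2}    (quadratic-delay)
        \mathrm{RPE}(w) = \alpha + \beta\,(w - d)_+^{p}                        (power-delay)

    Under either form, ratings stay flat at \alpha until work exceeds the delay d (median 40% Wmax above) and then rise; the reported median exponent p = 1.03 suggests roughly linear growth past the delay for a typical child.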

    Spatial normalization improves the quality of genotype calling for Affymetrix SNP 6.0 arrays

    BACKGROUND: Microarray measurements are susceptible to a variety of experimental artifacts, some of which give rise to systematic biases that are spatially dependent in a unique way on each chip. It is likely that such artifacts affect many SNP arrays, but the normalization methods used in currently available genotyping algorithms make no attempt at spatial bias correction. Here, we propose an effective single-chip spatial bias removal procedure for Affymetrix 6.0 SNP arrays or platforms with similar design features. This procedure deals with both extreme and subtle biases and is intended to be applied before standard genotype calling algorithms. RESULTS: Application of the spatial bias adjustments on HapMap samples resulted in higher genotype call rates with equal or even better accuracy for thousands of SNPs. Consequently the normalization procedure is expected to lead to more meaningful biological inferences and could be valuable for genome-wide SNP analysis. CONCLUSIONS: Spatial normalization can potentially rescue thousands of SNPs in a genetic study at the small cost of computational time. The approach is implemented in R and available from the authors upon request.
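    The abstract does not spell out the algorithm, so the following is only a minimal sketch of the general idea of single-chip spatial bias removal (the function name, the log2 scale, and the median-filter trend estimate are my assumptions, not the authors' method):

        import numpy as np
        from scipy.ndimage import median_filter

        def remove_spatial_bias(intensity, window=25):
            """Subtract a smooth spatial trend from probe intensities laid out
            in their physical chip (row, col) positions."""
            log_i = np.log2(intensity)
            trend = median_filter(log_i, size=window)   # local spatial background
            flat = log_i - (trend - trend.mean())       # remove bias, keep overall level
            return 2.0 ** flat

    Applied chip by chip before genotype calling, an adjustment of this kind flattens region-specific intensity shifts without touching the chip-wide signal level.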

    Analysis of time-to-event for observational studies: Guidance to the use of intensity models

    This paper provides guidance for researchers with some mathematical background on the conduct of time-to-event analysis in observational studies based on intensity (hazard) models. Basic concepts such as the choice of time axis, event definition, and censoring are discussed. Hazard models are introduced, with special emphasis on the Cox proportional hazards regression model. We provide checklists that may be useful both when fitting the model and assessing its goodness of fit, and when interpreting the results. Special attention is paid to how to avoid problems with immortal time bias by introducing time-dependent covariates. We discuss prediction based on hazard models and the difficulties that arise when attempting to draw proper causal conclusions from such models. Finally, we present a series of examples where the methods and checklists are exemplified. Computational details and implementation using the freely available R software are documented in Supplementary Material. The paper was prepared as part of the STRATOS initiative. Comment: 28 pages, 12 figures. For associated Supplementary material, see http://publicifsv.sund.ku.dk/~pka/STRATOSTG8
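    To illustrate the time-dependent covariate device for immortal time bias (a hedged sketch using the Python lifelines package and toy data, not the paper's R code): a subject who starts treatment during follow-up contributes an untreated interval and a treated interval in counting-process (start, stop] format, so the pre-treatment person-time is never credited to treatment.

        import pandas as pd
        from lifelines import CoxTimeVaryingFitter

        # Toy data: subject 1 starts treatment at t=3, so (0, 3] is untreated
        # person-time; crediting it to treatment is the immortal time bias.
        rows = [
            # id, start, stop, treated, event
            (1, 0, 3, 0, 0),    # untreated until t=3
            (1, 3, 8, 1, 1),    # treated from t=3, event at t=8
            (2, 0, 6, 0, 1),    # never treated, event at t=6
            (3, 0, 10, 0, 0),   # never treated, censored at t=10
        ]
        df = pd.DataFrame(rows, columns=["id", "start", "stop", "treated", "event"])

        ctv = CoxTimeVaryingFitter()
        ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
        ctv.print_summary()   # hazard ratio for the time-dependent 'treated' covariate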

    Relation of vertebral deformities to bone density, structure, and strength.

    Because they are not reliably discriminated by areal bone mineral density (aBMD) measurements, it is unclear whether minimal vertebral deformities represent early osteoporotic fractures. To address this, we compared 90 postmenopausal women with no deformity (controls) with 142 women with one or more semiquantitative grade 1 (mild) deformities and 51 women with any grade 2-3 (moderate/severe) deformities. aBMD was measured by dual-energy X-ray absorptiometry (DXA), lumbar spine volumetric bone mineral density (vBMD) and geometry by quantitative computed tomography (QCT), bone microstructure by high-resolution peripheral QCT (HRpQCT) at the radius, and vertebral compressive strength and load-to-strength ratio by finite-element analysis (FEA) of lumbar spine QCT images. Compared with controls, women with grade 1 deformities had significantly worse values for many bone density, structure, and strength parameters, although the deficits were all much greater in the women with grade 2-3 deformities. Likewise, these skeletal parameters were more strongly associated with moderate to severe than with mild deformities by age-adjusted logistic regression. Nonetheless, grade 1 vertebral deformities were significantly associated with four of the five main variable categories assessed: bone density (lumbar spine vBMD), bone geometry (vertebral apparent cortical thickness), bone strength (overall vertebral compressive strength by FEA), and load-to-strength ratio (45-degree forward bending Ă· vertebral compressive strength). Thus significantly impaired bone density, structure, and strength compared with controls indicate that many grade 1 deformities do represent early osteoporotic fractures, with corresponding implications for clinical decision making.
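    For context, the load-to-strength ratio above follows the usual biomechanical "factor of risk" form (notation mine, not from the paper):

        \Phi = F_{\text{applied}} / F_{\text{strength}}

    where F_applied is the estimated compressive load on the vertebra during a 45-degree forward bend and F_strength is the FEA-derived vertebral compressive strength, so values of \Phi approaching or exceeding 1 correspond to loading near or beyond the predicted failure load.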

    Adjusted Survival Curves


    Quality assessment metrics for whole genome gene expression profiling of paraffin embedded samples

    BACKGROUND: Formalin-fixed, paraffin-embedded tissues are most commonly used for routine pathology analysis and for long-term tissue preservation in the clinical setting. Many institutions have large archives of formalin-fixed, paraffin-embedded tissues that provide a unique opportunity for understanding genomic signatures of disease. However, genome-wide expression profiling of formalin-fixed, paraffin-embedded samples has been challenging due to RNA degradation. Because of the significant heterogeneity in tissue quality, normalization and analysis of these data present particular challenges. The distributions of intensity values from archival tissues are inherently noisy and skewed due to differential sample degradation, raising two primary concerns: whether a highly skewed array will unduly influence initial normalization of the data, and whether outlier arrays can be reliably identified. FINDINGS: Two simple extensions of common regression diagnostic measures are introduced that measure the stress an array undergoes during normalization and how much a given array deviates from the remaining arrays post-normalization. These metrics are applied to a study involving 1618 formalin-fixed, paraffin-embedded HER2-positive breast cancer samples from the N9831 adjuvant trial processed with Illumina's cDNA-mediated Annealing, Selection, Extension, and Ligation (DASL) assay. CONCLUSION: Proper assessment of array quality within a research study is crucial for controlling unwanted variability in the data. The metrics proposed in this paper have direct biological interpretations and can be used to identify arrays that should either be removed from analysis altogether or down-weighted to reduce their influence in downstream analyses.
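    The abstract does not give the formulas, so the following is only a hypothetical sketch of the two kinds of metric it describes (names and definitions are my own illustration, not the published ones):

        import numpy as np

        def normalization_stress(raw, normed):
            """Per-array 'stress': mean absolute change in log2 intensity that
            normalization imposed (raw, normed: probes x arrays matrices)."""
            return np.mean(np.abs(np.log2(normed) - np.log2(raw)), axis=0)

        def post_norm_deviation(normed):
            """Per-array deviation from the per-probe median profile of all
            arrays after normalization (an outlier-array screen)."""
            ref = np.median(normed, axis=1, keepdims=True)   # pseudo-reference array
            return np.mean(np.abs(normed - ref), axis=0)

    Arrays with extreme values of either statistic would be candidates for removal or down-weighting, in the spirit of leverage and influence diagnostics in regression.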

    3' tag digital gene expression profiling of human brain and universal reference RNA using Illumina Genome Analyzer

    BACKGROUND: Massively parallel sequencing has the potential to replace microarrays as the method for transcriptome profiling. Currently there are two protocols: full-length RNA sequencing (RNA-seq) and 3'-tag digital gene expression (DGE). In this preliminary effort, we evaluated the 3' DGE approach using two reference RNA samples from the MicroArray Quality Control Consortium (MAQC). RESULTS: Using the brain RNA sample across multiple runs, we demonstrated that the transcript profiles from 3' DGE were highly reproducible between technical and biological replicates, between libraries constructed by the same lab and by different labs, and between two generations of Illumina's Genome Analyzers. Approximately 65% of all sequence reads mapped to mitochondrial genes, ribosomal RNAs, and canonical transcripts. Comparison of the expression profiles of brain RNA and universal human reference RNA demonstrated that DGE was also highly quantitative, with excellent correlation of differential expression with quantitative real-time PCR. Furthermore, one lane of 3' DGE sequencing, using the current sequencing chemistry and image processing software, had a wider dynamic range for transcriptome profiling and was able to detect genes expressed at levels normally below the detection threshold of microarrays. CONCLUSION: 3' tag DGE profiling with massively parallel sequencing achieved high sensitivity and reproducibility for transcriptome profiling. Although it lacks the ability to detect alternative splicing events compared with RNA-seq, it is much more affordable and clearly outperformed microarrays (Affymetrix) in detecting low-abundance transcripts.
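    As a toy illustration of the tag-counting step that underlies 3' DGE (not the authors' pipeline; read mapping and gene assignment are assumed to have happened upstream, and the gene names are made up):

        import math
        from collections import Counter

        def tag_counts(mapped_genes):
            """Count 3' tags per gene from an iterable of gene IDs,
            one entry per uniquely mapped read."""
            return Counter(mapped_genes)

        def log2_fold_change(counts_a, counts_b, gene, pseudo=1.0):
            """Compare one gene between two libraries on a counts-per-million
            scale, with a pseudocount to stabilize low counts."""
            cpm_a = 1e6 * counts_a[gene] / sum(counts_a.values())
            cpm_b = 1e6 * counts_b[gene] / sum(counts_b.values())
            return math.log2((cpm_a + pseudo) / (cpm_b + pseudo))

        brain = tag_counts(["GFAP", "GFAP", "ACTB", "MT-CO1"])   # toy brain library
        uhr   = tag_counts(["ACTB", "ACTB", "MT-CO1"])           # toy reference library
        print(log2_fold_change(brain, uhr, "GFAP"))

    Because each transcript contributes a single 3' tag regardless of its length, libraries are compared directly per million mapped tags, with no transcript-length normalization as in RNA-seq.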